Twitter users about to dive into a heated online debate will be warned that they are entering an “intense” conversation, under safety features being trialled by the platform.
The social media platform is testing prompts that appear beneath potentially contentious exchanges, stating: “Heads up. Conversations like this can be intense.” Another prompt, which appears to be aimed at people composing a reply, goes to greater lengths to calm users, urging tweeters to “pay attention to each other”, to “remember humans” and to note that “a variety of perspectives have value”.
The trial is being carried out with a small group of users, in English-language settings, on Apple’s iOS platform.
In testimony to the US Senate this week, the Facebook whistleblower Frances Haugen cited Twitter’s efforts to take the heat out of certain interactions as an example that her former employer could follow. Haugen, who said Facebook was too focused on keeping the platform “twitchy” and “viral”, said Twitter had reduced angry interactions by introducing a feature that asks users whether they want to read a linked article before retweeting it.
The “intense” conversation test is the latest in Twitter’s efforts to limit abuse on its platform, an issue that came into renewed focus in the UK this year after England footballers were harassed by Twitter users during the Euro 2020 tournament.
Other initiatives being tested by the US company include: a feature that allows users to remove unwanted followers without formally blocking them; and a “safety mode” that blocks accounts for seven days if the platform’s technology detects them using harmful language or sending repetitive, uninvited replies. The safety mode feature was initially tested among a small group of users, with a particular emphasis on female journalists and members of marginalised communities.
Twitter is also considering giving users the ability to archive old tweets, removing them from public view after a specified period of time, such as 30, 60 or 90 days.
Online harassment has come into sharp legislative focus in the UK with the draft online safety bill, which imposes a duty of care on social media companies to protect users from harmful content. Under the draft bill, social media companies would be required to submit risk assessments of harmful content to Ofcom, the communications regulator. A committee of MPs and peers scrutinising the bill is due to report by the end of the year.